A Bidirectional LSTM Language Model for Code Evaluation and Repair

Authors

Abstract

Programming is a vital skill in computer science and engineering-related disciplines. However, developing source code is an error-prone task. Logical errors are particularly hard to identify for both students and professionals, and a single error may be unexpected to end-users. At present, conventional compilers have difficulty identifying many of the errors (especially logical errors) that can occur in code. To mitigate this problem, we propose a language model for evaluating source codes using a bidirectional long short-term memory (BiLSTM) neural network. We trained the BiLSTM model with a large number of source codes, tuning various hyperparameters. We then used the model to evaluate incorrect code and assessed the model's performance in three principal areas: error detection, suggestions for repair, and erroneous code classification. Experimental results showed that the proposed model achieved 50.88% correctness in providing suggestions. Moreover, it attained an F-score of approximately 97%, outperforming other state-of-the-art models (recurrent neural networks (RNNs) and LSTM).
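The core of the approach above is a bidirectional LSTM: one LSTM reads the token sequence left-to-right, a second reads it right-to-left, and the two hidden states are concatenated per token so each position sees both past and future context. The paper's actual architecture and hyperparameters are not given here; the following is a minimal numpy sketch of the BiLSTM forward pass only (the `lstm_step`, `bilstm`, and parameter shapes are illustrative assumptions, not the authors' implementation).

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. W: (4H, D), U: (4H, H), b: (4H,) pack the
    input, forget, output, and candidate gates in that order."""
    z = W @ x + U @ h + b
    H = h.shape[0]
    i = 1.0 / (1.0 + np.exp(-z[:H]))        # input gate
    f = 1.0 / (1.0 + np.exp(-z[H:2*H]))     # forget gate
    o = 1.0 / (1.0 + np.exp(-z[2*H:3*H]))   # output gate
    g = np.tanh(z[3*H:])                    # candidate cell state
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def bilstm(xs, params_fw, params_bw, H):
    """Run a forward and a backward LSTM over the token embeddings xs
    and concatenate the per-token hidden states (dim 2H each)."""
    def run(seq, params):
        h, c = np.zeros(H), np.zeros(H)
        out = []
        for x in seq:
            h, c = lstm_step(x, h, c, *params)
            out.append(h)
        return out
    fw = run(xs, params_fw)
    bw = run(xs[::-1], params_bw)[::-1]     # backward pass, re-aligned
    return [np.concatenate([f, b]) for f, b in zip(fw, bw)]
```

In the paper's setting, each 2H-dimensional output vector would feed a softmax over the token vocabulary; tokens whose actual identity receives low probability are flagged as likely errors, and high-probability alternatives serve as repair suggestions.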


Similar Articles

DR-BiLSTM: Dependent Reading Bidirectional LSTM for Natural Language Inference

We present a novel deep learning architecture to address the natural language inference (NLI) task. Existing approaches mostly rely on simple reading mechanisms for independent encoding of the premise and hypothesis. Instead, we propose a novel dependent reading bidirectional LSTM network (DR-BiLSTM) to efficiently model the relationship between a premise and a hypothesis during encoding and in...


Learning Natural Language Inference using Bidirectional LSTM model and Inner-Attention

In this paper, we propose a sentence encoding-based model for recognizing text entailment. In our approach, the encoding of a sentence is a two-stage process. Firstly, average pooling is used over word-level bidirectional LSTM (BiLSTM) outputs to generate a first-stage sentence representation. Secondly, an attention mechanism is employed to replace average pooling on the same sentence for better represent...


A Bidirectional Model For Natural Language Processing

In this paper, I will argue for a model of grammatical processing that is based on uniform processing and knowledge sources. The main feature of this model is to view parsing and generation as two strongly interleaved tasks performed by a single parametrized deduction process. It will be shown that this view supports flexible and efficient natural language processing.


Bidirectional LSTM-CRF Models for Sequence Tagging

In this paper, we propose a variety of Long Short-Term Memory (LSTM) based models for sequence tagging. These models include LSTM networks, bidirectional LSTM (BI-LSTM) networks, LSTM with a Conditional Random Field (CRF) layer (LSTM-CRF) and bidirectional LSTM with a CRF layer (BI-LSTM-CRF). Our work is the first to apply a bidirectional LSTM CRF (denoted as BI-LSTM-CRF) model to NLP benchmark...
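The BI-LSTM-CRF models above add a linear-chain CRF on top of the BiLSTM's per-token emission scores, so that the predicted tag sequence respects transition regularities between adjacent tags. The decoding step is Viterbi search; the following is a small illustrative numpy sketch of that decoder (the `viterbi` function and score shapes are assumptions for exposition, not the paper's code).

```python
import numpy as np

def viterbi(emissions, transitions):
    """Best tag sequence under a linear-chain CRF.
    emissions: (T, K) per-token tag scores (e.g. BiLSTM outputs);
    transitions: (K, K) score of moving from tag i to tag j."""
    T, K = emissions.shape
    score = emissions[0].copy()          # best score ending in each tag
    back = np.zeros((T, K), dtype=int)   # backpointers
    for t in range(1, T):
        # total[i, j] = best path ending in tag i, then moving to tag j
        total = score[:, None] + transitions + emissions[t][None, :]
        back[t] = total.argmax(axis=0)
        score = total.max(axis=0)
    best = [int(score.argmax())]
    for t in range(T - 1, 0, -1):        # follow backpointers
        best.append(int(back[t, best[-1]]))
    return best[::-1]
```

Training the CRF layer additionally requires the forward algorithm for the partition function; only decoding is sketched here.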


Bidirectional LSTM-CRF for Clinical Concept Extraction

Automated extraction of concepts from patient clinical records is an essential facilitator of clinical research. For this reason, the 2010 i2b2/VA Natural Language Processing Challenges for Clinical Records introduced a concept extraction task aimed at identifying and classifying concepts into predefined categories (i.e., treatments, tests and problems). State-of-the-art concept extraction appr...



Journal

Journal title: Symmetry

Year: 2021

ISSN: 0865-4824, 2226-1877

DOI: https://doi.org/10.3390/sym13020247